Projection Theorems of Divergences and Likelihood Maximization Methods
Authors

Abstract
Projection theorems of divergences enable us to find the reverse projection of a divergence on a specific statistical model as a forward projection of the divergence on a different but rather “simpler” statistical model, which, in turn, reduces to solving a system of linear equations. Reverse projections of divergences are closely related to various estimation methods such as maximum likelihood estimation or its variants in robust statistics. We consider projection theorems of three parametric families of divergences that are widely used in robust statistics, namely the Rényi divergences (or the Cressie-Read Power Divergences), Density Power Divergences, and the Relative α-Entropy (or the Logarithmic Density Power Divergences). We explore these projection theorems from the usual likelihood maximization approach and from the principle of sufficiency. In particular, we show the equivalence of solving the estimation problems by the projection theorems of the respective divergences and by directly solving the corresponding estimating equations.
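As a minimal sketch of the connection between reverse projections and estimating equations mentioned above (this example is illustrative and not taken from the paper): for a normal location model with known variance, the reverse KL projection of the empirical measure onto the model is the maximum likelihood estimate, and the corresponding estimating equation is linear in the parameter.

```python
import numpy as np

# Illustrative sketch (assumed example, not the paper's): reverse KL projection
# onto the normal location model N(mu, 1) is maximum likelihood estimation.
# The score (estimating) equation sum_i (x_i - mu) = 0 is linear in mu,
# so the projection reduces to solving one linear equation: mu_hat = mean(x).
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=1000)

# Solve the linear estimating equation directly.
mu_hat = data.mean()

# With 1000 samples the estimate should be close to the true location 2.0.
assert abs(mu_hat - 2.0) < 0.2
```

The same pattern, with a divergence other than KL, yields the robust estimators (e.g. minimum density power divergence estimators) discussed in the abstract, though their estimating equations are in general no longer linear in the parameter.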
Similar Papers
Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization
We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta-, and Gam...
Learning mixtures by simplifying kernel density estimators
Gaussian mixture models are a widespread tool for modeling various and complex probability density functions. They can be estimated by various means, often using Expectation-Maximization or Kernel Density Estimation. In addition to these well known algorithms, new and promising stochastic modeling methods include Dirichlet Process mixtures and k-Maximum Likelihood Estimators. Most of the method...
The Development of Maximum Likelihood Estimation Approaches for Adaptive Estimation of Free Speed and Critical Density in Vehicle Freeways
The performance of many traffic control strategies depends on how accurately the traffic flow models have been calibrated. One of the most applicable traffic flow models in traffic control and management is the LWR or METANET model. Practically, key parameters in the LWR model, including free-flow speed and critical density, are parameterized using flow and speed measurements gathered by inductive ...
On The Equivalence of Projections In Relative α-Entropy and Rényi Divergence
The aim of this work is to establish that two recently published projection theorems, one dealing with a parametric generalization of relative entropy and another dealing with Rényi divergence, are equivalent under a correspondence on the space of probability measures. Further, we demonstrate that the associated “Pythagorean” theorems are equivalent under this correspondence. Finally, we apply ...
A Proximal Point Algorithm for Minimum Divergence Estimators with Application to Mixture Models
Estimators derived from a divergence criterion such as φ-divergences are generally more robust than maximum likelihood estimators. We are interested in particular in the so-called minimum dual φ-divergence estimator (MDφDE), an estimator built using a dual representation of φ-divergences. We present in this paper an iterative proximal point algorithm that permits the calculation of such an estima...
Journal: CoRR
Volume: abs/1705.09898
Pages: -
Published: 2017